Fast Relative-Error Approximation Algorithm for Ridge Regression

Authors

  • Shouyuan Chen
  • Yang Liu
  • Michael R. Lyu
  • Irwin King
  • Shengyu Zhang
Abstract

Ridge regression is one of the most popular and effective regularized regression methods, and one case of particular interest is when the number of features p is much larger than the number of samples n, i.e. p ≫ n. In this case, the standard optimization algorithm for ridge regression computes the optimal solution x* in O(n²p + n³) time. In this paper, we propose a fast relative-error approximation algorithm for ridge regression. More specifically, our algorithm outputs a solution x̃ satisfying ‖x̃ − x*‖ ...
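The O(n²p + n³) cost quoted in the abstract for the exact solution in the p ≫ n regime corresponds to the standard dual (kernelized) form of ridge regression, which solves an n × n system instead of a p × p one. A minimal sketch of that exact solver (variable names are illustrative; this is the baseline the paper speeds up, not the paper's approximation algorithm):

```python
import numpy as np

def ridge_dual(A, b, lam):
    # Exact ridge solution via the n x n dual system: for A with n rows
    # (samples) and p columns (features), the minimizer of
    # ||Ax - b||^2 + lam * ||x||^2 is x* = A^T (A A^T + lam I)^{-1} b,
    # costing O(n^2 p) for the Gram matrix plus O(n^3) for the solve.
    n = A.shape[0]
    K = A @ A.T                                      # n x n Gram matrix
    alpha = np.linalg.solve(K + lam * np.eye(n), b)  # n x n linear solve
    return A.T @ alpha                               # map back to p dims

rng = np.random.default_rng(0)
n, p, lam = 50, 2000, 0.1                # p >> n, as in the abstract
A = rng.standard_normal((n, p))
b = rng.standard_normal(n)
x_dual = ridge_dual(A, b, lam)

# cross-check against the primal normal equations (a p x p solve)
x_primal = np.linalg.solve(A.T @ A + lam * np.eye(p), A.T @ b)
```

The two solutions coincide by the push-through identity (AᵀA + λI)⁻¹Aᵀ = Aᵀ(AAᵀ + λI)⁻¹, which is why the dual route is preferred whenever p ≫ n.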


Similar resources

LOCO: Distributing Ridge Regression with Random Projections

We propose LOCO, a distributed algorithm which solves large-scale ridge regression. LOCO randomly assigns variables to different processing units which do not communicate. Important dependencies between variables are preserved using random projections which are cheap to compute. We show that LOCO has bounded approximation error compared to the exact ridge regression solution in the fixed design...
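The LOCO idea described above can be illustrated with a simplified, single-process sketch: each worker keeps its own raw feature block and sees the remaining blocks only through a cheap Gaussian random projection, then solves a local ridge problem. Everything here (function name, `proj_dim`, the plain Gaussian projection) is an illustrative assumption, not the paper's exact construction:

```python
import numpy as np

def loco_sketch(A, b, lam, n_workers=4, proj_dim=40, seed=0):
    # Illustrative single-process sketch of the LOCO idea: split features
    # across workers; each worker augments its raw block with a random
    # projection of all other blocks, solves a local ridge problem, and
    # keeps only the coefficients of its own raw features.
    rng = np.random.default_rng(seed)
    n, p = A.shape
    blocks = np.array_split(np.arange(p), n_workers)
    coef = np.zeros(p)
    for k, idx in enumerate(blocks):
        raw = A[:, idx]                       # this worker's raw features
        rest = np.hstack([A[:, j] for i, j in enumerate(blocks) if i != k])
        P = rng.standard_normal((rest.shape[1], proj_dim)) / np.sqrt(proj_dim)
        local = np.hstack([raw, rest @ P])    # raw block + compressed rest
        w = np.linalg.solve(local.T @ local + lam * np.eye(local.shape[1]),
                            local.T @ b)
        coef[idx] = w[:len(idx)]              # discard projected-part weights
    return coef

rng = np.random.default_rng(1)
A = rng.standard_normal((100, 400))
x_true = np.zeros(400)
x_true[:10] = 1.0
b = A @ x_true + 0.1 * rng.standard_normal(100)
coef = loco_sketch(A, b, lam=1.0)
```

The projection step is what preserves the "important dependencies between variables" mentioned above while keeping the workers communication-free.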


Ridge Regression and Provable Deterministic Ridge Leverage Score Sampling

Ridge leverage scores provide a balance between low-rank approximation and regularization, and are ubiquitous in randomized linear algebra and machine learning. Deterministic algorithms are also of interest in the moderately big data regime, because deterministic algorithms provide interpretability to the practitioner by having no failure probability and always returning the same results. We pr...


Sampling Circulant Matrix Approach: A Comparison of Recent Kernel Matrix Approximation Techniques in Ridge Kernel Regression

As part of a survey of state-of-the-art kernel approximation algorithms, we present a new sampling algorithm for circulant matrix construction to perform fast kernel matrix inversion in kernel ridge regression, comparing its theoretical and experimental performance with that of multilevel circulant kernel approximation, incomplete Cholesky decomposition, and random features, all recent advances in th...


Regularized Laplacian Estimation and Fast Eigenvector Approximation

Recently, Mahoney and Orecchia demonstrated that popular diffusion-based procedures to compute a quick approximation to the first nontrivial eigenvector of a data graph Laplacian exactly solve certain regularized Semi-Definite Programs (SDPs). In this paper, we extend that result by providing a statistical interpretation of their approximation procedure. Our interpretation will be analogous to ...


Two-Parameters Fuzzy Ridge Regression with Crisp Input and Fuzzy Output

In this paper a new weighted fuzzy ridge regression method for a given set of crisp input and triangular fuzzy output values is proposed. In this regard, a ridge estimator of the fuzzy parameters is obtained for the regression model, and its prediction error is calculated using the weighted fuzzy norm of the crisp ridge coefficients. To evaluate the proposed regression model, we introduce the fu...




Journal title:

Volume   Issue 

Pages  -

Publication date: 2015